Unsupervised Skeleton Extraction and Motion Capture from Kinect Video via 3D Deformable Matching

Authors

  • Quanshi Zhang
  • Xuan Song
  • Xiaowei Shao
  • Ryosuke Shibasaki
  • Huijing Zhao
Abstract

This paper presents a novel method for extracting the skeletons of complex articulated objects from 3D point cloud sequences collected by the Kinect. Our approach is more robust than traditional video-based and stereo-based approaches, because the Kinect directly provides 3D information without requiring markers, 2D-to-3D transition assumptions, or feature point extraction. We track all the raw 3D points on the object and use the point trajectories to determine the object skeleton. Point tracking is achieved by 3D non-rigid matching based on a Markov Random Field (MRF) deformation model, and a coarse-to-fine procedure is proposed to reduce the large computational cost of the non-rigid matching. To the best of our knowledge, this is the first work to extract skeletons of highly deformable objects from 3D point cloud sequences by point tracking. Experiments demonstrate the method's good performance, and the extracted skeletons are successfully applied to motion capture.
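The abstract describes the pipeline only at a high level: track raw 3D points across frames, then infer the skeleton from the point trajectories. As a loose, hypothetical illustration of the second step (not the authors' MRF-based formulation), the sketch below groups trajectories into approximately rigid parts by clustering on how much each pairwise distance varies over time; the part centroids can then serve as crude joint candidates. The function names and the n_parts parameter are assumptions made for this example.

```python
import numpy as np
from scipy.spatial.distance import cdist, squareform
from scipy.cluster.hierarchy import linkage, fcluster

def cluster_trajectories(trajectories, n_parts=10):
    """Group tracked points into approximately rigid parts.

    trajectories: (T, N, 3) array of N raw 3D points tracked over T frames.
    Points on the same rigid segment keep a near-constant mutual distance,
    so the temporal std of pairwise distances is a natural dissimilarity.
    """
    dists = np.stack([cdist(frame, frame) for frame in trajectories])  # (T, N, N)
    variation = dists.std(axis=0)                 # rigidity cue per point pair
    variation = 0.5 * (variation + variation.T)   # enforce exact symmetry
    np.fill_diagonal(variation, 0.0)
    condensed = squareform(variation, checks=False)
    tree = linkage(condensed, method="average")
    return fcluster(tree, t=n_parts, criterion="maxclust")  # labels 1..n_parts

def part_centers(trajectories, labels, frame=0):
    """Crude joint candidates: centroid of each rigid part in a chosen frame."""
    pts = trajectories[frame]
    return np.array([pts[labels == k].mean(axis=0) for k in np.unique(labels)])
```

Connecting adjacent part centroids (for instance with a minimum spanning tree over their average distances) would then give a rough skeleton topology.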


Similar resources

Human Gait Gender Classification using 3D Discrete Wavelet Transform Feature Extraction

Feature extraction for gait recognition has been studied widely. Approaches to this task fall into two categories: model-based and model-free. Model-based approaches obtain a set of static or dynamic skeleton parameters by modeling or tracking body components such as limbs, legs, arms and thighs. Model-free approaches focus on the shapes of silhouettes or the entire movement of physical ...

Full text

3D Skeleton model derived from Kinect Depth Sensor Camera and its application to walking style quality evaluations


Full text

3D Modeling for Capturing Human Motion from Monocular Video

This paper presents an approach to building a 3D human-body model from uncalibrated monocular video sequences. Using an orthographic projection model for the camera and a generic human body model, we construct a 2D-to-3D conversion scheme that generates a 3D skeletal model from the 2D image domain. The complete 3D geometry and kinematics information of a human being can be extracted without any a-prior...
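The snippet does not spell out the 2D-to-3D conversion under an orthographic camera, but a standard trick from the literature (Taylor's foreshortening method, which may differ from what this paper actually uses) recovers the relative depth of two connected joints from their image distance and the known limb length of the generic body model. A minimal sketch, with hypothetical names and a user-supplied scale factor s:

```python
import math

def relative_depth(u1, v1, u2, v2, limb_length, scale):
    """|dZ| between two joints under scaled orthographic projection.

    With u = s*X and v = s*Y (up to an offset), a limb of 3D length L satisfies
    L^2 = dX^2 + dY^2 + dZ^2, so dZ^2 = L^2 - (du^2 + dv^2) / s^2.
    The sign of dZ (toward or away from the camera) remains ambiguous.
    """
    d_image_sq = (u1 - u2) ** 2 + (v1 - v2) ** 2
    dz_sq = limb_length ** 2 - d_image_sq / (scale ** 2)
    if dz_sq < 0:
        raise ValueError("limb length is inconsistent with the 2D observation")
    return math.sqrt(dz_sq)
```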

Full text

Modélisation anatomique utilisateur-spécifique et animation temps-réel. Application à l'apprentissage de l'anatomie. (User-specific real-time registration and tracking applied to anatomy learning)

Title: User-specific real-time registration and tracking applied to anatomy learning. To make the complex task of anatomy learning easier, there exist many ways to represent and structure anatomical knowledge (drawings, books, cadaver dissections, 3D models, etc.). However, it may be difficult to understand and analyze anatomical motion from these static media, which is essential for medicine stu...

Full text

Tracking facial features in video sequences using a deformable model-based approach

This paper addresses the issue of computer vision-based face motion capture as an alternative to physical sensor-based technologies. The proposed method combines deformable template-based tracking of the mouth and eyes in arbitrary video sequences containing a single speaking person with a global 3D head pose estimation procedure that yields robust initializations. Mathematical principles underlying deforma...

Full text


Journal title:

Volume:   Issue:

Pages:  -

Publication date: 2012